Classification of Minimal Separating Sets in Low Genus Surfaces
Consider a surface $S$ and let $M \subset S$. If $S \setminus M$ is not
connected, then we say $M$ \emph{separates} $S$, and we refer to $M$ as a
\emph{separating set} of $S$. If $M$ separates $S$, and no proper subset of
$M$ separates $S$, then we say $M$ is a \emph{minimal separating set} of $S$. In
this paper we use methods of computational combinatorial topology to classify
the minimal separating sets of the orientable surfaces of genus $2$ and
$3$. The classification for genus 0 and 1 was done in earlier work, using
methods of algebraic topology.
Comment: 24 pages, 5 figures, 2 tables (11 pages
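The definitions above have a direct combinatorial analogue for graphs, which is the natural setting for a computational search. The following sketch is not the authors' code; the 6-cycle example and all function names are illustrative choices. It tests whether a vertex set separates a graph and, if so, whether it is minimal:

```python
from collections import deque

def is_connected(vertices, edges):
    """BFS connectivity check on an undirected graph."""
    vertices = set(vertices)
    if not vertices:
        return True
    adj = {v: [] for v in vertices}
    for u, v in edges:
        if u in adj and v in adj:   # ignore edges touching deleted vertices
            adj[u].append(v)
            adj[v].append(u)
    start = next(iter(vertices))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == len(vertices)

def separates(vertices, edges, m):
    """m separates the graph if deleting it disconnects what remains."""
    return not is_connected(set(vertices) - set(m), edges)

def is_minimal_separating(vertices, edges, m):
    """m separates, and no proper subset of m separates."""
    if not separates(vertices, edges, m):
        return False
    return all(not separates(vertices, edges, set(m) - {x}) for x in m)

# 6-cycle: two "antipodal" vertices separate it minimally.
V = range(6)
E = [(i, (i + 1) % 6) for i in range(6)]
print(is_minimal_separating(V, E, {0, 3}))      # True
print(is_minimal_separating(V, E, {0, 1, 3}))   # False: {0, 3} already separates
```

On a surface the analogous search runs over subcomplexes of a triangulation rather than vertex subsets, but the separation and minimality tests have the same two-step shape.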
Learning High-Dimensional Nonparametric Differential Equations via Multivariate Occupation Kernel Functions
Learning a nonparametric system of ordinary differential equations (ODEs)
from trajectory snapshots in a $d$-dimensional state space requires
learning $d$ functions of $d$ variables. Explicit formulations scale
quadratically in $d$ unless additional knowledge about system properties, such
as sparsity and symmetries, is available. In this work, we propose a linear
approach to learning using the implicit formulation provided by vector-valued
Reproducing Kernel Hilbert Spaces. By rewriting the ODEs in a weaker integral
form, which we subsequently minimize, we derive our learning algorithm. The
minimization problem's solution for the vector field relies on multivariate
occupation kernel functions associated with the solution trajectories. We
validate our approach through experiments on highly nonlinear simulated and
real data, where $d$ may exceed 100. We further demonstrate the versatility of
the proposed method by learning a nonparametric first order quasilinear partial
differential equation.
Comment: 22 pages, 3 figures, submitted to Neurips 202
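A minimal numerical sketch of the occupation-kernel idea, reconstructed from the abstract alone (the Gaussian kernel, the oscillator test system, the segment length, and the regularization are illustrative assumptions, not the paper's setup): the ODE is used only in its weak integral form, x(t_end) - x(t_start) = ∫ f(x(s)) ds over each trajectory segment; each segment contributes one occupation-kernel feature, and the vector field is recovered by a single regularized linear solve.

```python
import numpy as np

def gauss_kernel(X, Y, s=1.0):
    """Gaussian kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * s ** 2))

# Exact trajectory of the 2-d linear oscillator x' = (x2, -x1).
dt = 0.01
T = np.arange(0, 6, dt)
X = np.stack([np.cos(T), -np.sin(T)], axis=1)

# Each segment of the trajectory yields one occupation-kernel feature.
seg_len = 20
n_seg = len(T) // seg_len
segs = [X[i * seg_len:(i + 1) * seg_len + 1] for i in range(n_seg)]

# Weak form targets: x(t_end) - x(t_start) = integral of f along the segment.
dX = np.stack([s[-1] - s[0] for s in segs])            # (n_seg, 2)

# Gram matrix of occupation kernels, G_ij = double integral of
# k(x_i(s), x_j(u)) over the two segments, via a Riemann sum.
G = np.zeros((n_seg, n_seg))
for i, si in enumerate(segs):
    for j, sj in enumerate(segs):
        G[i, j] = gauss_kernel(si[:-1], sj[:-1]).sum() * dt * dt

lam = 1e-6
alpha = np.linalg.solve(G + lam * np.eye(n_seg), dX)   # (n_seg, 2) coefficients

def f_hat(x):
    """Estimated vector field: weighted sum of occupation-kernel evaluations."""
    x = np.atleast_2d(x)
    feats = np.array([gauss_kernel(x, s[:-1]).sum(1) * dt for s in segs])
    return feats.T @ alpha

print(f_hat([1.0, 0.0]))   # true value of f at (1, 0) is (0, -1)
```

Note the linearity in the abstract's sense: one Gram matrix and one solve, with no explicit parametrization of the d² interactions among coordinates.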
Learning nonparametric ordinary differential equations from noisy data
Learning nonparametric systems of Ordinary Differential Equations (ODEs)
ẋ = f(t,x) from noisy data is an emerging machine learning topic. We use the
well-developed theory of Reproducing Kernel Hilbert Spaces (RKHS) to define
candidates for f for which the solution of the ODE exists and is unique.
Learning f consists of solving a constrained optimization problem in an RKHS.
We propose a penalty method that iteratively uses the Representer theorem and
Euler approximations to provide a numerical solution. We prove a generalization
bound for the L2 distance between x and its estimator and provide experimental
comparisons with the state-of-the-art.
Comment: 25 pages, 6 figure
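A simplified sketch of such a penalty scheme, reconstructed from the abstract (the Gaussian kernel, penalty weights, and 1-d test system ẋ = -x are illustrative assumptions, not the authors' implementation): each iteration first fits f in an RKHS, where the Representer theorem makes the fit a finite linear solve, so that the Euler residuals (x_{i+1} - x_i)/h - f(x_i) are small, and then updates the state estimates by trading data fidelity against the Euler constraint.

```python
import numpy as np

def gauss_kernel(X, Y, s=0.5):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * s ** 2))

# Noisy samples of x' = -x, x(0) = 1.
rng = np.random.default_rng(0)
h = 0.05
t = np.arange(0, 3, h)
Y = (np.exp(-t) + 0.01 * rng.standard_normal(len(t)))[:, None]

x = Y.copy()                       # state estimates, initialized at the data
lam_pen, lam_rkhs = 10.0, 1e-3

for _ in range(5):
    # Representer step: f(.) = sum_j alpha_j k(., x_j), fitted to the
    # Euler difference quotients of the current state estimates.
    basis = x[:-1].copy()
    K = gauss_kernel(basis, basis)
    target = (x[1:] - x[:-1]) / h
    alpha = np.linalg.solve(K + lam_rkhs * np.eye(len(K)), target)
    f = lambda z, B=basis, A=alpha: gauss_kernel(np.atleast_2d(z), B) @ A

    # Penalty step: pull each state toward its Euler prediction while
    # keeping it close to the observation.
    for i in range(1, len(x)):
        euler = x[i - 1] + h * f(x[i - 1])[0]
        x[i] = (Y[i] + lam_pen * euler) / (1 + lam_pen)

print(f(np.array([1.0]))[0])   # true value of f at x = 1 is -1
```

In the paper the two ingredients are coupled inside one constrained optimization; the alternating loop above is only meant to show how the Representer theorem and the Euler approximation interact.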
Learning Nonparametric Ordinary Differential Equations: Application to Sparse and Noisy Data
Learning nonparametric systems of Ordinary Differential Equations (ODEs) ẋ = f(t,x) from noisy and sparse data is an emerging machine learning topic. We use the well-developed theory of Reproducing Kernel Hilbert Spaces (RKHS) to define candidates for f for which the solution of the ODE exists and is unique. Learning f consists of solving a constrained optimization problem in an RKHS. We propose a penalty method that iteratively uses the Representer theorem and Euler approximations to provide a numerical solution. We prove a generalization bound for the L2 distance between x and its estimator. Experiments are provided for the FitzHugh-Nagumo oscillator and for the prediction of the Amyloid level in the cortex of aging subjects. In both cases, we show competitive results when compared with the state of the art.